This triple correspondence between logical
instructions, the action of the mind, and a machine which could in
principle be embodied in a practical physical form, was Turing's
definitive contribution. Having made this novel definition of what
should count as a 'definite method' — in modern language, an
algorithm — it was not too hard to answer Hilbert's question in the
negative: no such decision procedure exists.
In April 1936 he showed his result to Newman; but at the same
moment the parallel conclusion of the American logician Alonzo
Church became known, and Turing was robbed of the full reward for
his originality. His paper, On Computable Numbers, with an
Application to the Entscheidungsproblem, had to refer to
Church's work, and was delayed until August 1936. However, it was
seen at the time that Turing's approach was original and different;
Church relied upon an assumption internal to mathematics, rather
than appealing to operations that could actually be done by real
things or people in the physical world. Subsequently, the concept of
the Turing machine has become the foundation of the modern
theory of computation and computability.
Turing worked in isolation from the powerful school of logical
theory centred on Church at Princeton University, and his work
emerged as that of a complete outsider. One can only speculate, but
it looks as if Turing found in the concept of the Turing machine
something that would satisfy the fascination with the problem of
Mind that Christopher Morcom had sparked; his total originality lay
in seeing the relevance of mathematical logic to a problem
originally seen as one of physics. In this paper, as in so many
aspects of his life, Turing made a bridge between the logical and
the physical worlds, thought and action, which crossed conventional
boundaries.
His work introduced a concept of immense practical significance:
the idea of the Universal Turing Machine. The concept of 'the Turing
machine' is like that of 'the formula' or 'the equation'; there is
an infinity of possible Turing machines, each corresponding to a
different 'definite method' or algorithm. But imagine, as Turing
did, each particular algorithm written out as a set of instructions
in a standard form. Then the work of interpreting the instructions
and carrying them out is itself a mechanical process, and
so can itself be embodied in a particular Turing machine,
namely the Universal Turing Machine. A Universal Turing
Machine can be made to do what any other particular Turing machine
would do, by supplying it with the standard form describing that
Turing machine. One machine, for all possible tasks.
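The idea can be made concrete in a few lines of modern code. The sketch below is an illustration, not Turing's 1936 formalism: a single interpreter function plays the role of the universal machine, and any particular Turing machine is supplied to it as data, in the form of a transition table (the names `run_turing_machine` and `flipper` are invented for this example).

```python
def run_turing_machine(table, tape, state="start", blank="_", steps=10_000):
    """Run the machine described by `table` on the given tape.

    `table` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "L" or "R"; the machine stops in state "halt".
    """
    cells = dict(enumerate(tape))   # tape as a sparse, two-way-infinite array
    head = 0
    for _ in range(steps):          # step bound, since halting is undecidable
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# One particular machine, written out as a standard-form table:
# it flips every bit on the tape, then halts at the first blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flipper, "1011"))  # -> 0100
```

The point is that `run_turing_machine` never changes: to make it behave as a different machine, one supplies a different table, just as the Universal Turing Machine is supplied with the standard form describing the machine to be imitated.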
It is hard now not to think of a Turing machine as a computer
program, and the mechanical task of interpreting and obeying the
program as what the computer itself does. Thus, the Universal Turing
Machine embodies the essential principle of the computer: a single
machine which can be turned to any well-defined task by being
supplied with the appropriate program.
Additionally, the abstract Universal Turing Machine naturally
exploits what was later seen as the 'stored program' concept
essential to the modern computer: it embodies the crucial
twentieth-century insight that symbols representing instructions are
no different in kind from symbols representing numbers. But
computers, in this modern sense, did not exist in 1936. Turing
created these concepts out of his mathematical imagination. Only
nine years later would electronic technology be tried and tested
sufficiently to make it practical to transfer the logic of his ideas
into actual engineering. In the meanwhile the idea lived only in his
mind.
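The stored-program insight, that instruction-symbols and number-symbols are the same kind of thing, can also be illustrated with a toy machine (a modern sketch, with invented opcodes, not a reconstruction of any historical design): program and data sit side by side as plain integers in one shared memory.

```python
def run(memory):
    """Execute integer-coded instructions held in `memory` itself.

    Hypothetical opcodes: 1 = ADD a b dest, 2 = PRINT addr, 0 = HALT.
    Operands are memory addresses; the program counter starts at 0.
    """
    pc = 0
    output = []
    while memory[pc] != 0:                  # opcode 0: HALT
        if memory[pc] == 1:                 # ADD: mem[dest] = mem[a] + mem[b]
            a, b, dest = memory[pc + 1:pc + 4]
            memory[dest] = memory[a] + memory[b]
            pc += 4
        elif memory[pc] == 2:               # PRINT: emit mem[addr]
            output.append(memory[memory[pc + 1]])
            pc += 2
    return output

# Code and data share one memory: cells 0-6 are instructions,
# cell 7 is unused padding, cells 8-9 hold data, cell 10 is scratch.
memory = [1, 8, 9, 10,    # ADD  mem[8] + mem[9] -> mem[10]
          2, 10,          # PRINT mem[10]
          0,              # HALT
          0, 20, 22, 0]   # data: 20, 22, and a scratch cell

print(run(memory))  # -> [42]
```

Nothing in the memory marks a cell as "instruction" or "number"; the distinction exists only in how the machine happens to read it, which is exactly the insight the abstract Universal Turing Machine already contained.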
In common with other outstanding young scientists, Turing spent
two years at Princeton University enrolled as a graduate student. He
arrived in September 1936. On Computable Numbers... was
published at the very end of 1936 and attracted some attention; by
the time he left, the idea had come to the attention of the leading
Hungarian-American mathematician John von Neumann. But Turing
certainly did not shoot to fame. He worked on algebra and number
theory; on showing that his definition of computability coincided
with that of Church; and on an extension of his ideas (Ordinal
Logics) which provided a Ph.D. thesis.
The work on 'ordinal logics', probably his most difficult and
deepest mathematical work, was an attempt to bring some kind of
order to the realm of the uncomputable. This also was
connected to the question of the nature of mind, as Turing's
interpretation of his ideas suggested that human 'intuition' could
correspond to uncomputable steps in an argument. But Turing never
pursued this line of development after 1938. Instead, he was
increasingly preoccupied with more immediate problems which demanded
logical skills.
True to the concreteness of the Turing machine, he also spent
time at Princeton building a cipher machine that used
electromagnetic relays to multiply binary numbers. Even then he saw
a link from 'useless' logic to practical computation. Although not
one of the political intellectuals of the 1930s, Turing followed
current events and was influenced in studying ciphers by the
prospect of war with Germany.
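Multiplication of binary numbers reduces to shifting and adding, which is what makes it natural to build from simple on/off components such as relays. The sketch below shows the shift-and-add method in software; it is an illustration of the arithmetic only, not a reconstruction of Turing's device.

```python
def binary_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by shifting and adding,
    examining the multiplier one binary digit at a time."""
    product = 0
    while b:
        if b & 1:          # lowest bit of b is 1: add the shifted a
            product += a
        a <<= 1            # shift a left (double it)
        b >>= 1            # shift b right (move to the next bit)
    return product

print(binary_multiply(0b1011, 0b110))  # 11 * 6 -> 66
```

Each pass of the loop corresponds to one binary digit of the multiplier, so the whole computation is a fixed cascade of conditional additions, exactly the kind of logic a bank of relays can embody.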